% scribe: Alex Skorokhod
% lastupdate: 6 December 2005
% lecture: 24
% references: Durrett, sections 4.4, 4.5 and 4.6
% title: L^1 convergence, uniform integrability, reversed MG
% keywords: uniform integrability, L^1 convergence, reversed martingales
% end

\documentclass[12pt, letterpaper]{article}
\include{macros}
\usepackage{floatflt}

\def\locv{\stackrel{\scriptscriptstyle\L^1}{\longrightarrow}} %L1 cvgce

\begin{document}
\lecture{24}{$\L^1$-convergence and reversed MG}{Alex Skorokhod}{skor@math}

References: \cite{durrett}, sections 4.4, 4.5 and 4.6.

\section{Martingale Convergence}

Recall Dubins' inequality from the previous lecture: for a non-negative
supermartingale $S_n$, $n=0,1,2,\ldots$ and $0<a<b$, the number
$U_{[a,b]}$ of upcrossings of $[a,b]$ satisfies
$$\P(U_{[a,b]}\geq k)\leq\left(\frac{a}{b}\right)^k
\E\!\left(\min\!\left(\frac{S_0}{a},1\right)\right).$$
In particular, every non-negative supermartingale converges a.s.

As an application, consider an integer $S_0>0$ and
$S_n=S_0+X_1+\ldots+X_n$ where the $X_i$ are i.i.d.\ with
$\E|X_1|<\infty$. Let $T_0=\inf\{n:S_n\leq0\}$ and consider
$S_{n\wedge T_0}$. In order to avoid overshooting and guarantee
$S_{T_0}=0$ (on $\{T_0<\infty\}$), assume that $X_i\in\{-1,0,1,\ldots\}$.
We also want to exclude an always constant process, so assume
$\P(X_1=0)<1$. The Strong Law of Large Numbers tells us that
$S_n/n\ascv\E(X_1)$, so if $\E(X_1)<0$ then $T_0<\infty$ trivially; the
interesting case is $\E(X_1)=0$, which makes $S_{n\wedge T_0}$ a
non-negative supermartingale (in fact a martingale), hence it has to
converge a.s. On the event $\{T_0=\infty\}$, however, we must have
$X_i\neq0$ i.o.\ (by Borel--Cantelli, since the $X_i$ are independent with
$\P(X_1\neq0)>0$), and an integer-valued process that moves i.o.\ cannot
converge. Hence $\P(T_0<\infty)=1$: the walk hits $0$ almost surely. The
same argument shows that a branching process at the critical value dies
out a.s.

\section{More on $\L^2$ Convergence}

Consider an $\L^2$-bounded martingale $(M_n,\F_n)$ (that is, with
$\sup_n\E(M_n^2)<\infty$). Recall that $M_n$ has orthogonal increments,
hence $M_n$ converges in $\L^2$ to some limit $M_\infty$:
\begin{equation}
M_n\ltcv M_\infty \mbox{ and } \E M_{\infty}^2=\sup_n \E M_n^2
\end{equation}
However, we can now prove even more: a.s.\ convergence. There are two ways
to prove it. One is to use a version of Kolmogorov's inequality; the other
is to use the martingale convergence theorem together with our control
over the second moment:
$$\sup_n(\E M_n^-)\leq\sup_n\E|M_n|\leq\sup_n\sqrt{\E M_n^2}
\leq\sqrt{\sup_n\E M_n^2}<\infty.$$

\begin{claim}
\begin{equation}
M_n=\E(M_\infty|\F_n)
\end{equation}
In other words, every $\L^2$-bounded martingale is a sequence of
conditional expectations of some target r.v.\ $M_\infty$.
\end{claim}

\begin{proof}
We know that $M_n=\E(M_N|\F_n)$ for $N>n$, and we have $M_N\ltcv M_\infty$
as $N\toinf$. We want to show that (for ${\cal E}=\F_n$)
$$\E(M_N|{\cal E})\ltcv\E(M_\infty|{\cal E}).$$
The property of $\E(\cdot|{\cal E})$ that gives us this is continuity of
$\E(\cdot|{\cal E})$ when viewed as an operator on $\L^2$; this is a
simple fact from functional analysis. The operator is linear and also
bounded, since its norm is $1$:
$$\E\big(\E(X|{\cal E})^2\big)\leq\E\big(\E(X|{\cal E})^2\big)
+\E\big((X-\E(X|{\cal E}))^2\big)=\E(X^2)$$
by orthogonality of $\E(X|{\cal E})$ and $X-\E(X|{\cal E})$, and $1$ is
achieved for constant $X$. Since $\E(M_N|\F_n)=M_n$ for every $N>n$,
letting $N\toinf$ gives $M_n=\E(M_\infty|\F_n)$.
\end{proof}

Note that the above result is subsumed by Doob's result on martingale
convergence, which follows from a Kolmogorov-like inequality as mentioned
above. The key fact obtained from it is
$$\E\Big(\sup_n M_n^2\Big)\leq 4\sup_n\E(M_n^2).$$
A similar inequality exists for general $\L^p$ with $p>1$, with constant
$\left(\frac{p}{p-1}\right)^p$ in place of $4$; as $p\downarrow1$ the
constant blows up and the analysis gets more difficult.
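The following standard example (not from the lecture) illustrates the
$\L^2$ theory. Let $\xi_1,\xi_2,\ldots$ be i.i.d.\ signs with
$\P(\xi_k=\pm1)=\frac12$, and set
$$M_n=\sum_{k=1}^n\frac{\xi_k}{k},\qquad\F_n=\sigma(\xi_1,\ldots,\xi_n).$$
Then $(M_n,\F_n)$ is a martingale with orthogonal increments, and
$$\sup_n\E(M_n^2)=\sum_{k=1}^\infty\frac{1}{k^2}=\frac{\pi^2}{6}<\infty,$$
so $M_n\to M_\infty$ a.s.\ and in $\L^2$, with $M_n=\E(M_\infty|\F_n)$. In
particular, the random harmonic series $\sum_k\xi_k/k$ converges a.s.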
\section{Convergence in $\L^1$}

Consider a sequence $M_n=\E(M_\infty|\F_n)$ for some $M_\infty\in\L^1$ and
a filtration $\F_n$. Such a process is a martingale. Let us ask which
martingales have this form (``closed in $\L^1$'').

\begin{theorem}
\label{unifintequiv}
Given a probability space $(\Omega,\F,\P)$ and a filtration $\F_n$, the
following are equivalent:
\begin{enumerate}
\item There exists $M_\infty\in\L^1$ such that $M_n=\E(M_\infty|\F_n)$;
\item $(M_n,\F_n)$ is a martingale which converges in $\L^1$ (and it
converges to $M_\infty$); and
\item $(M_n,\F_n)$ is a uniformly integrable martingale.
\end{enumerate}
\end{theorem}

Let us define uniform integrability, which is somewhat similar to the
concept of tightness encountered earlier.

\begin{definition}
A collection of r.v.\ $(X_i, i\in I)$ is \emph{uniformly integrable} if
$$\lim_{x\toinf}\sup_i\E(|X_i|\1_{|X_i|>x})=0.$$
\end{definition}

See the text (\cite{durrett}, 4.5) for the proof of the following
properties of uniform integrability, which are the last word on swapping
expectations and limits:

\begin{theorem}
If $X_n\ascv X$ and $(X_n)$ is uniformly integrable, then:
\begin{enumerate}
\item $\E|X|<\infty$; and
\item $X_n\locv X$, and hence $\E(X_n)\rightarrow\E(X)$.
\end{enumerate}
Moreover, if $X_n\geq0$, the converse is true:
\begin{equation}
\E\!\left(\lim_n X_n\right)=\lim\E(X_n)\Leftrightarrow
(X_n)\mbox{ is uniformly integrable}.
\end{equation}
\end{theorem}

Note that if $X_n$ converges in $\L^1$, it is automatically uniformly
integrable.

\begin{proof} (of Thm.\ \ref{unifintequiv})
\begin{enumerate}
\item $(1)\Rightarrow(3)$: Follows from a more general fact (see
\cite{durrett}, Chapter 4, (5.1)): given $(\Omega,\F_0,\P)$ and an
$X\in\L^1$, the family
$\{\E(X|\F):\F\text{ is a sub-$\sigma$-field}\subset\F_0\}$ is uniformly
integrable.
\item $(3)\Rightarrow(2)$: Uniform integrability implies
$\sup_n\E|M_n|<\infty$, so $M_n$ converges a.s.\ by the martingale
convergence theorem; since it is also uniformly integrable, it converges
in $\L^1$.
\item $(2)\Rightarrow(1)$: Treat the same way as the $\L^2$ case and use
continuity of $\E(\cdot|\F)$ as an operator, now on $\L^1$: it is a
contraction, since $\E|\E(X|\F)|\leq\E|X|$.
\end{enumerate}
\end{proof}

To show that uniform integrability is essential, consider simple random
walk $S_n$ started at $1$ and stopped at the first zero (recall
$\P(T_0<\infty)=1$). Then $S_{n\wedge T_0}=0$ for $n\geq T_0$, and
$S_{n\wedge T_0}\ascv0$. However,
$\E(S_{n\wedge T_0})=1\neq0=\E(\lim_{n\toinf}S_{n\wedge T_0})$. Hence we
can conclude that $S_{n\wedge T_0}$ is not uniformly integrable and does
not converge in $\L^1$.

\section{Reversed Martingales}

These arise as $\E(X|\G_n)$ where $\G_n$ is a \emph{decreasing} rather
than an increasing sequence of $\sigma$-fields. They are sometimes also
called \emph{backwards} martingales.

\begin{example}
If $\G_n=\sigma(X_n,X_{n+1},\ldots)$, then
$\G_n\downarrow{\cal T}(X_1,X_2,\ldots)$, the tail $\sigma$-field.
\end{example}

In general, if $\G_n\downarrow\G_\infty:=\bigcap_n\G_n$, then for
$X\in\L^1$ the following convergence holds both a.s.\ and in $\L^1$:
\begin{equation}
\E(X|\G_n)\longrightarrow\E(X|\G_\infty).
\end{equation}
The proof uses the upcrossing inequality.

\begin{definition}
A sequence of random variables $X_i$ is called \emph{exchangeable} if for
all $n$ and any permutation $\pi$ on $n$ objects we have:
$$(X_1,\ldots,X_n)\deq(X_{\pi(1)},\ldots,X_{\pi(n)}).$$
\end{definition}

The condition of exchangeability is stronger than the assumption that the
individual random variables in the sequence are identically distributed,
and weaker than the assumption that they are independent and identically
distributed; a concrete exchangeable but non-independent sequence is
sketched after the following example.

\begin{example}
\begin{enumerate}
\item All $X_i$'s are i.i.d.
\item Pick a random probability distribution $F$ on $\R$ and, given $F$,
pick the $X_i$ i.i.d.\ with distribution $F$ (a \emph{mixture} of i.i.d.\
sequences).
\end{enumerate}
\end{example}
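As a concrete illustration (a standard example, not from the lecture),
consider P\'olya's urn: start with one red and one black ball; at each
step draw a ball uniformly at random and return it together with one more
ball of the same colour. Let $X_i=\1(\mbox{$i$-th draw is red})$. A direct
computation shows that for $x_1,\ldots,x_n\in\{0,1\}$ with $k=\sum_i x_i$,
$$\P(X_1=x_1,\ldots,X_n=x_n)=\frac{k!\,(n-k)!}{(n+1)!}
=\int_0^1 p^k(1-p)^{n-k}\,dp,$$
which depends on $(x_1,\ldots,x_n)$ only through $k$, so the sequence is
exchangeable. It is not independent:
$$\P(X_1=1,X_2=1)=\frac12\cdot\frac23=\frac13\neq\frac14
=\P(X_1=1)\,\P(X_2=1).$$
The integral representation exhibits the mixture explicitly: given $p$,
uniform on $[0,1]$, the $X_i$ are i.i.d.\ Bernoulli($p$), an instance of
the mixture in the previous example and of the theorem below.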
A famous converse to the previous example is the following theorem:

\begin{theorem}[de Finetti]
Every exchangeable sequence of real-valued r.v.\ has the same distribution
as some mixture of i.i.d.\ sequences.
\end{theorem}

Let ${\cal E}_n$ be the $\sigma$-field generated by events which are
invariant under permutations that leave $n+1,n+2,\ldots$ fixed. Observe
that ${\cal E}_n\downarrow$ as $n\uparrow$, hence
${\cal E}_n\downarrow{\cal E}_\infty$, which is called the
\emph{exchangeable $\sigma$-field}. Note that ${\cal E}_\infty$ is richer
than the tail field (in other words,
${\cal T}(X_1,X_2,\ldots)\subseteq{\cal E}_\infty$).

For example, consider $S_n=X_1+\ldots+X_n$ for exchangeable $X_i$.
Exchangeability yields
$$\E(X_1|{\cal E}_n)=\E(X_2|{\cal E}_n)=\ldots=\E(X_n|{\cal E}_n).$$
Adding up all terms and dividing by $n$ gives us
$$\E(X_1|{\cal E}_n)=\frac{\E(S_n|{\cal E}_n)}{n}=\frac{S_n}{n},$$
since $S_n$ is ${\cal E}_n$-measurable. Hence, by reversed martingale
convergence, we obtain a new proof of the SLLN: for i.i.d.\ $X_i$ with
$\E|X_1|<\infty$,
$$\frac{S_n}{n}=\E(X_1|{\cal E}_n)\ascv\E(X_1|{\cal E}_\infty)=\E(X_1),$$
since ${\cal E}_\infty$ is trivial by the Hewitt--Savage 0--1 law.

\bibliographystyle{plain}
\bibliography{../books.bib}

\end{document}